1. What is the total quantity of information entropy in the universe?
|
2. Instead, the appropriate quantity to maximise is the "relative information entropy".
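One common convention, with the symbols here chosen only for illustration, writes the relative information entropy of a discrete distribution p with respect to a reference (prior) distribution m as the negative of the Kullback-Leibler divergence:

\[ H_{\mathrm{rel}}(p) = -\sum_i p_i \log \frac{p_i}{m_i} = -D_{\mathrm{KL}}(p \,\|\, m), \]

which reduces to the ordinary Shannon entropy, up to an additive constant, when m is uniform, so maximising it then recovers the usual maximum-entropy procedure.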
|
3. Indeed, information entropy can be used as an index of qualitative variation.
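A standard way to build such an index, using symbols introduced here for illustration, is to divide the Shannon entropy of a categorical distribution over k categories by its maximum possible value, so the result lies between 0 and 1:

\[ \mathrm{IQV}_H = \frac{-\sum_{i=1}^{k} p_i \log p_i}{\log k}, \]

which is 0 when every observation falls in a single category and 1 when all k categories are equally frequent.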
|
4. The link between information entropy and thermodynamic entropy is a debated topic.
|
5. According to this principle, the distribution with maximal information entropy is the proper one.
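The principle referred to here is presumably the principle of maximum entropy (Jaynes): among all distributions consistent with what is known, choose the one that maximises the information entropy. Schematically, with the constraints written generically,

\[ p^{*} = \arg\max_{p} \Bigl( -\sum_i p_i \log p_i \Bigr) \quad \text{subject to} \quad \sum_i p_i = 1 \ \text{and any known expectation constraints.} \]

With no constraint beyond normalisation the maximiser is the uniform distribution; fixing the mean on a non-negative support gives the exponential distribution, and fixing mean and variance on the real line gives the Gaussian.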
|
6. Shannon's information entropy is a much more general concept than statistical thermodynamic entropy.
|
7. Entropy, if considered as information (see information entropy), is measured in bits.
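A minimal sketch of the computation, assuming only Python's standard math module (the function name entropy_bits is chosen here for illustration); taking the logarithm to base 2 is what makes the result come out in bits:

```python
import math

def entropy_bits(probs):
    """Shannon entropy of a discrete distribution, in bits (log base 2)."""
    # Zero-probability outcomes contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.9, 0.1]))   # biased coin: ~0.47 bits
print(entropy_bits([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```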
|
8. At an everyday practical level, the links between information entropy and thermodynamic entropy are not evident.
|
9. The logarithm can also be taken to the natural base in the case of information entropy.
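With the natural logarithm the same sum gives the entropy in nats rather than bits, and a constant factor converts between the two units:

\[ H_{\mathrm{nats}} = -\sum_i p_i \ln p_i, \qquad 1\ \text{nat} = \frac{1}{\ln 2} \approx 1.443\ \text{bits}. \]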
|
10. This is the case of the unbiased bit, the most common unit of information entropy.
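As a short worked example of why the unbiased bit serves as the unit: a single fair binary variable, with both outcomes at probability 1/2, carries exactly one bit of entropy:

\[ H = -\left( \tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2} \right) = \log_2 2 = 1\ \text{bit}. \]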
|